Release Notes for Q3 2023

Explore the new features added in this update!

Updated in: November 2023

Release Version: 0.15

Product portfolios, products, release trains, and dashboards can be made public

Product portfolios, products, release trains, and dashboards can now be made publicly available. For example, if you mark a product as public, it is visible under the All Products tab on the Products screen.
Jenkins CI/CD pipeline support is now available for a Databricks custom transformation job

A Jenkins CI/CD pipeline checks the code written for a Databricks custom transformation job for vulnerabilities and syntax errors by performing code compilation and code scans. This way you can ensure the quality of the code written for a custom transformation job.
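
As an illustration only (not the Lazsa-generated pipeline itself), the Python sketch below shows the kind of syntax check such a CI stage might run over a job's source files; the directory name is hypothetical:

    # Hypothetical stand-in for the syntax-check step a CI pipeline runs
    # against the Python code of a custom transformation job.
    import pathlib
    import py_compile
    import sys

    def check_syntax(job_dir: str) -> int:
        """Byte-compile every .py file under job_dir and count failures."""
        failures = 0
        for path in pathlib.Path(job_dir).rglob("*.py"):
            try:
                py_compile.compile(str(path), doraise=True)
            except py_compile.PyCompileError as err:
                print(f"Syntax error in {path}: {err.msg}")
                failures += 1
        return failures

    if __name__ == "__main__":
        # Fail the pipeline stage if any file does not compile.
        sys.exit(1 if check_syntax("transformation_job/") else 0)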

Support for Lazsa Orchestrator Agent, AWS Secrets Manager, and Azure Key Vault extended for various data integration, data transformation, and data quality tools

Support for Lazsa Orchestrator Agent, AWS Secrets Manager, and Azure Key Vault is now extended to the following tools used for data integration, data transformation, and data quality:

  • Databricks

  • Snowflake

  • RDBMS

    • MySQL

    • Microsoft SQL Server

    • Oracle

    • Snowflake

  • Qlik Sense

Amazon Redshift now supported as a data source and for creating a crawler and catalog

You can now use Amazon Redshift as a data source while using Databricks for data integration to load data into an Amazon S3 lake. You can also create a data crawler and a data catalog using Amazon Redshift as the source.
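
Conceptually, this is a Spark read over JDBC followed by a write to S3. A minimal PySpark sketch with a hypothetical cluster endpoint, table, credentials, and bucket (DPS configures and runs the actual job for you):

    # Minimal PySpark sketch of the Redshift-to-S3 flow; all identifiers
    # below are placeholders, and the Redshift JDBC driver must be present.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    # Read the source table from Amazon Redshift over JDBC.
    df = (
        spark.read.format("jdbc")
        .option("url", "jdbc:redshift://example-cluster.us-east-1.redshift.amazonaws.com:5439/dev")
        .option("dbtable", "public.sales")
        .option("user", "example_user")
        .option("password", "example_password")
        .load()
    )

    # Land the extracted data in the Amazon S3 lake in Parquet format.
    df.write.mode("append").parquet("s3://example-data-lake/raw/sales/")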
Snowflake transformation jobs now support uploading a custom script

To facilitate the execution of bulk commands in a transformation job, the UI now supports uploading a custom script in the transformation step of a Snowflake custom transformation job.
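
To illustrate what a custom script enables, the sketch below runs several statements in bulk against Snowflake using the snowflake-connector-python package; the connection details and SQL are hypothetical, and in DPS the uploaded script is executed for you:

    # Hypothetical example of the bulk commands a custom script can carry.
    import snowflake.connector

    conn = snowflake.connector.connect(
        account="example_account",
        user="example_user",
        password="example_password",
        warehouse="EXAMPLE_WH",
        database="EXAMPLE_DB",
        schema="PUBLIC",
    )

    bulk_script = """
    CREATE OR REPLACE TABLE orders_clean AS
        SELECT * FROM orders WHERE order_id IS NOT NULL;
    UPDATE orders_clean SET status = UPPER(status);
    DELETE FROM orders_clean WHERE amount < 0;
    """

    # execute_string runs each statement in the script in sequence.
    for cursor in conn.execute_string(bulk_script):
        print(cursor.rowcount, "rows affected")
    conn.close()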
Usability improvements in Data Pipeline Studio (DPS) home screen and job configuration screen

On the DPS home screen, you can now collapse the stages on the left side of the screen (the ones you use to create your data pipeline) to get a wider view of a complex pipeline. Similarly, when you create jobs in DPS, you can collapse the left pane of a configuration screen for a wider view.
Pipeline chaining is now possible between pipelines in Data Pipeline Studio

With the pipeline chaining feature, you can initiate one pipeline run based on a specific event, such as the success or failure of another pipeline run, using the Schedule Run feature. This helps automate runs across multiple pipelines.
Rejected records can now be stored at a specified location

When data goes through the data quality stage, some records can be rejected for failing to meet the specified criteria. Data Pipeline Studio now supports storing such rejected records at a specific location. You can create or choose a folder and subfolder on the same datastore that you selected for the target.
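
A conceptual PySpark sketch of the separation (the quality rule, paths, and datastore layout are hypothetical; DPS handles this when you pick the rejected-records folder):

    # Split records into passed and rejected during the data quality stage,
    # then store the rejected ones in a folder on the same target datastore.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.read.parquet("s3://example-data-lake/staging/customers/")

    # Example quality rule: email must be present and non-empty.
    rule = F.col("email").isNotNull() & (F.trim(F.col("email")) != "")

    passed = df.filter(rule)
    rejected = df.filter(~rule)

    passed.write.mode("append").parquet("s3://example-data-lake/curated/customers/")
    rejected.write.mode("append").parquet("s3://example-data-lake/curated/customers_rejected/")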
Support for upsert operation is available with Parquet and Delta table formats

An upsert operation updates an existing record if it exists or inserts a new row if it does not. Data Pipeline Studio now supports the upsert operation for Parquet and Delta table formats during data validation.
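
For Delta tables, an upsert is typically expressed as a merge. A minimal sketch using the delta-spark package, with hypothetical paths and key column:

    # Upsert (merge) incoming rows into a Delta table keyed on customer_id.
    from delta.tables import DeltaTable
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()

    updates = spark.read.parquet("s3://example-data-lake/incoming/customers/")
    target = DeltaTable.forPath(spark, "s3://example-data-lake/curated/customers_delta/")

    (
        target.alias("t")
        .merge(updates.alias("s"), "t.customer_id = s.customer_id")
        .whenMatchedUpdateAll()      # key exists: update the record
        .whenNotMatchedInsertAll()   # key missing: insert a new row
        .execute()
    )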
Approval workflow implemented for data pipelines

Data Pipeline Studio now supports an approval workflow for approving the data in a data pipeline. A new stage, Data Workflow, is added to DPS; you must add a workflow node to this stage. Data from an AWS S3 lake is downloaded, corrected, approved, and then uploaded back to AWS S3.
Data validation rules added in Issue Resolver stage

New data validation rules are added to the Issue Resolver stage. These include dragging validation columns to set the priority of validation checks, sorting data in ascending or descending order, and performing string operations such as LTrim, RTrim, LPad, and RPad.
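
For reference, these string operations correspond to standard Spark SQL functions; a small PySpark illustration with a hypothetical code column:

    # LTrim/RTrim strip whitespace; LPad/RPad pad a string to a fixed width.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([("  ab12  ",)], ["code"])

    df.select(
        F.ltrim(F.col("code")).alias("code_ltrim"),        # remove leading spaces
        F.rtrim(F.col("code")).alias("code_rtrim"),        # remove trailing spaces
        F.lpad(F.col("code"), 10, "0").alias("code_lpad"), # left-pad to 10 chars with '0'
        F.rpad(F.col("code"), 10, "0").alias("code_rpad"), # right-pad to 10 chars with '0'
    ).show()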
Single or multiple file support now available for SFTP and FTP data sources

When you use FTP or SFTP as a data source for data ingestion into a Snowflake data lake, you can select a single file, or you can select a folder with multiple files using the filter option for file formats such as CSV, JSON, XLSX, and PARQUET.
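
Conceptually, the folder-plus-filter option behaves like the following paramiko sketch (host, credentials, and paths are placeholders; DPS applies the filter you choose in the UI):

    # List a remote SFTP folder and keep only files matching one format.
    import fnmatch
    import paramiko

    transport = paramiko.Transport(("sftp.example.com", 22))
    transport.connect(username="example_user", password="example_password")
    sftp = paramiko.SFTPClient.from_transport(transport)

    # Selecting a single file works the same way; here we filter for CSV.
    names = sftp.listdir("/exports/daily")
    csv_files = [n for n in names if fnmatch.fnmatch(n.lower(), "*.csv")]

    for name in csv_files:
        sftp.get(f"/exports/daily/{name}", f"/tmp/{name}")

    sftp.close()
    transport.close()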
Secrets management tool support for FTP and SFTP data sources

The Lazsa Platform now provides secrets management tool support for FTP and SFTP data sources. This means you can securely fetch your FTP or SFTP server credentials stored in AWS Secrets Manager or Azure Key Vault to establish a connection with the Lazsa Platform, enabling the use of FTP or SFTP as data sources in your data ingestion pipelines.
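
As an illustration of the AWS side, credentials can be resolved at connection time rather than stored in the pipeline definition; the secret name and keys below are hypothetical:

    # Fetch SFTP credentials from AWS Secrets Manager at connection time.
    import json
    import boto3

    client = boto3.client("secretsmanager", region_name="us-east-1")
    secret = json.loads(
        client.get_secret_value(SecretId="example/sftp-credentials")["SecretString"]
    )

    username = secret["username"]  # used to authenticate to the SFTP server
    password = secret["password"]  # never hard-coded in the pipeline itself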
Support for data ingestion of unstructured data using FTP or SFTP as a data source

Support is now available for data ingestion of unstructured data into an S3 data lake using FTP or SFTP as a data source. You can select the file format Other and then browse to select a single file or multiple files of different formats. Supported file formats are JPG, PNG, ZIP, TEXT, DOC, PDF, MP3, and MP4.
Support for new technologies

The following new technologies are now integrated with the Lazsa Platform:

  • Java 18 with Spring Boot 3.1.3 - Maven

  • Java 18 with Spring Boot 3.1.3 - Gradle

  • Java 18 with GraphQL Spring Boot 3.1.3 - Gradle

  • Java 18 with GraphQL Spring Boot 3.1.3 - Maven

  • Java 17 with Spring Boot 3.1.3

  • React 18.2

  • Python 3.11.5

Cloud tag support in cloud account configuration

You can now add tags for your AWS and Azure cloud resources that are created from your cloud platform account configuration screen in Lazsa. While creating a new Virtual Private Cloud (VPC) or a new key pair from Lazsa, you can specify your desired cloud tags as key-value pairs in the connection details of your cloud account. These tags are applied to resources such as virtual networks, subnets, route tables, internet gateways, and SSH keys that are created from the AWS or Azure configuration screen in Lazsa. Support for cloud tags helps you better organize and track your cloud resources.
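
The effect is the same as tagging the resources directly in the cloud provider; for example, on AWS (the resource ID and tags below are placeholders):

    # Apply key-value tags to an AWS resource, as Lazsa does for the
    # resources it creates from the cloud account configuration.
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")
    ec2.create_tags(
        Resources=["vpc-0abc1234def567890"],  # e.g., the VPC created from Lazsa
        Tags=[
            {"Key": "Environment", "Value": "dev"},
            {"Key": "CostCenter", "Value": "data-platform"},
        ],
    )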
Approval workflow implemented for publishing a policy template

Approval workflow for publishing a policy template is now available in Lazsa. If you enable this workflow, a template can be published only after it is thoroughly reviewed and approved by a designated approver. This helps you ensure greater control and compliance in your policy template management.
Approval workflow implemented for adding technologies

Approval workflow for adding technologies is now available in Lazsa. If you enable this workflow, users can add technologies for feature development only after they are thoroughly reviewed and approved by a designated approver. This helps you control, standardize, and optimize the usage of technology stacks and ensure compliance with the regulatory standards of your organization.
Role-based access control (RBAC) implemented for Develop and Deploy phases

RBAC is implemented for the Develop and Deploy phases in the platform. Based on the assigned roles and their corresponding permissions, you can now manage and control user access and actions such as adding, deploying, and deleting technologies; adding, editing, and deleting a stage; configuring instances and clusters; and managing Terraform runs, among others. RBAC provides you with the flexibility to define and control who can do what, as required by your organization.
Improved user interface for access management for saved configurations

We have improved the access management interface for tool configurations with the following enhancements:

  • Clearer request statuses

  • Ability for users to cancel their own access requests

  • Notifications for administrators about new access requests and the number of pending requests

  • Date and time of the access request in the list of pending requests

  • Name of the approver along with the time of approval

  • Configuration access history for better tracking
Extended role-based access control (RBAC) across the platform

RBAC now extends to various areas such as dashboards, portfolios, products, and release trains. We've optimized system-defined roles and permissions.
Enhanced access management user interface for Databases and Data Warehouses configurations

The addition or removal of users or teams in a database or data warehouse configuration is now logged on the History tab of the access management screen for the configuration.

On the Databases and Data Warehouses screen, you can now sort the saved configurations by date of creation (newest to oldest) and by the number of pending access requests in descending order.

This improved interface enhances the overall user experience by simplifying access management and ensuring better visibility into configuration changes.